Association for Academic Language and Learning
Journal of Academic Language & Learning
Vol. 10, No. 1, 2016, A223-A236. ISSN 1835-5196
Michelle Cavaleri
Student Learning Support, Navitas Professional Institute, Sydney NSW 2000, Australia
Email: michelle.cavaleri@acap.edu.au
Saib Dianati
Student Learning Centre, Flinders University, Adelaide SA 5042, Australia
Email: saib.dianati@flinders.edu.au
(Received 29 October 2015; Published online 30 January 2016)
Feedback on students' writing is considered an instrumental part of the academic advising process. However, due to the time constraints of the student-adviser interaction, Academic Language and Learning (ALL) advisers may find it difficult to provide comprehensive feedback to students regarding their grammatical mistakes. One solution is to utilise online grammar checking tools as a complement to feedback from an adviser. These tools can save advisers' time and resources while at the same time promoting greater self-directed learning and fostering students' self-efficacy. In spite of this, many Australian higher education institutions have overlooked this intersection between grammar support and online automated technology. This paper presents an overview of Grammarly, a popular online grammar checking website. In addition, it provides preliminary results of an evaluation of Grammarly by students at two Navitas colleges, the Australian College of Applied Psychology (ACAP) and Navitas College of Public Safety (NCPS). The students' survey responses are analysed against Davis' (1989) Technology Acceptance Model (TAM), which offers a conceptual framework for predicting the acceptability and use of a technology. The results reveal that students perceive Grammarly as useful and easy to use, and students reported that Grammarly improved their writing and their understanding of grammar rules.
Most academic language and learning (ALL) advisers would agree that students' knowledge of grammar and punctuation is sketchy at best. However, a command of these basic skills is essential for quality writing and success in academic contexts (Narita, 2012). Despite evidence of the positive impact of feedback on grammar, advisers in learning centres across Australia are time-constrained and are limited in the amount of grammar correction they are willing, or able, to do. Advisers may feel that it is not their responsibility to provide detailed grammatical feedback on students' papers, or they may not feel confident that they have the 'know-how' to explain complex grammatical rules (Jones, Myhill, & Bailey, 2013). Even if an adviser is willing and/or able to provide feedback on grammar, he or she may not have the time to provide comprehensive
grammatical feedback to students during the limited time of a student consultation session, particularly when other writing issues need attention. Consequently, many students, both native and non-native English speakers, are often in dire need of greater grammatical editing and proofreading support than the institution is willing or able to offer.
One solution to this problem is to rely more on self-access materials, such as online grammar checkers. While grammar books and paper-based exercises are portable, they lack the direct interactivity with students which online grammar checkers can provide. Grammar checkers, which are freely or commercially available, can automatically recognise and provide advice about grammatical errors in writing. With developments in artificial intelligence, algorithmic applications and natural language processing, several grammar checkers available on the market claim to offer effective and efficient feedback and suggestions on students' grammar, while further promoting students' self-regulation strategies.
This paper examines a popular online grammar checking website, Grammarly, and aims to understand the acceptance and use of Grammarly among higher education students against the framework of the Technology Acceptance Model (TAM).
Grammatical accuracy is critical to quality academic writing as it helps the writer express ideas clearly, accurately and precisely. Academic texts are expected to follow recognised English grammar conventions, such as accurate sentence structure, correct subject-verb agreement, consistent and appropriate tense, and correct use of articles. However, many undergraduate students are still on a trajectory of development in terms of their writing, and their linguistic choices may not always be accurate or successful (Myhill, 2009). At the sentence level, students may have difficulties with structures that are difficult to segment, such as constructions without function words or with ambiguous function words, as well as with structures that place a heavy burden on short-term memory, such as interruptions and long subject-noun phrases (Perera, 1984). Coffin et al. (2005) state that common grammatical errors in student writing also include not putting a main verb in each sentence, lack of pronoun agreement in sentences, ambiguous use of pronouns, and inconsistent use of tenses, as well as problems with apostrophe usage. Myhill (2009) adds that characteristics of more limited linguistic development include overdependence on coordination, difficulty managing ideas over long sentences, and lapses in coherence. Students from a non-English speaking background often have significant difficulties with some aspects of English grammar that are distinct from the problems of native English speakers. These include the use of articles (a, the), word order, word formation, selection of prepositions (on, at, in, etc.), omission of the relative pronoun and omission of plural "s" (Clerehan & Moore, 1995; Neumann, 1985). It is important that students have strategies for learning grammar rules and checking their work, as they may lose marks because they have neglected to follow English grammar conventions.
It is well established that feedback is useful for conscious learning of language and many studies support this claim. For example, Bitchener (2008) found that students studying English as a second language (ESL) who received feedback on their use of articles (a and the) within a piece of writing outperformed those in a control group, and this level of performance was retained two months later. Ferris and Roberts' (2001) study examined a wider range of linguistic error categories and provided evidence of significant positive effects for groups of students receiving feedback compared to a group that did not receive any feedback. They measured 72 ESL university students' abilities to revise their texts based on comments relating to five different error categories (verb errors, noun ending errors, article errors, word choice errors and sentence structure errors), and the success ratios of the revisions ranged from 47% (sentence structure errors) to 60% (article errors) (Ferris & Roberts, 2001). Similarly, a study involving high school students conducted by Jones et al. (2013) found that the teaching of grammar in relation to the writing being studied had an overall positive effect on students' writing, and found it particularly benefitted more able writers. These studies show that feedback on grammar can have a positive impact on writing.
One popular alternative or complement to teacher feedback on language is the use of computer-based methods. AbuSeileek (2009) argues that computer-based methods are an improvement over non-computer-based methods as they provide a greater amount of feedback and present more individualised material, which makes it easier for each learner to process it at his or her own pace. He also argues that because the learner can access help individually, it reduces anxiety and promotes a more relaxed atmosphere for learning. There are also arguments that computer-based methods are ideal for learning higher-level language skills. For example, Garrett (2009) maintains that explanations of advanced level grammatical concepts that involve dynamic, computer-based visuals are highly beneficial for students.
Online grammar checkers, therefore, could be an efficient and effective tool for enhancing grammar accuracy and learning. Word processing programs with built-in spelling and grammar checkers have been around since the mid-1980s, but for a long time were little more than a novelty (Major, 1994), and reviews of grammar checkers in the 1990s expressed disappointment at the checkers' accuracy (for example, see Pogue, 1993). In recent times, however, they are regarded as a helpful aid rather than a burden (Potter & Fuller, 2008), yet educators and students may still overlook the capability of this tool to improve grammar in a relevant and engaging way. Some current popular online grammar checkers include Grammarly, PaperRater, Grammark, After the Deadline and LanguageTool.
Typically, grammar checkers work by scanning through a text and providing immediate feedback on grammar, spelling, and punctuation errors. Grammar checkers can highlight issues such as subject-verb disagreement, split infinitives, double negatives, run-on sentences and incorrect use of prepositions. If the checker finds an error, it will explain the grammar rule and may also offer a solution which the user can accept or ignore. The checkers also highlight spelling errors and words that may have been confused. Some grammar checkers also offer feedback on style and vocabulary usage. Style is difficult to check because the intricacies of language require extensive artificial intelligence, but some grammar checkers also claim to have this capability. For example, some checkers will flag sentences that are written in the passive voice or indicate that a particular word may have been overused. Hence, many grammar checkers actually claim to do more than just check grammar. An important point to note is that grammar checkers do not claim to teach grammar; they are a tool to bring potential problems to the writer's attention.
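To make the scan-and-flag behaviour described above concrete, the following is a minimal, illustrative sketch of a rule-based checker in Python. The three rules shown (double negatives, a crude passive-voice flag, repeated words) are hypothetical simplifications chosen for illustration only; commercial checkers such as Grammarly rely on far richer linguistic analysis and are not implemented this way.

```python
import re

# Each rule pairs a regular expression with an explanation, mirroring the
# scan-and-flag behaviour described in the text. These rules are deliberately
# crude and are not how any commercial checker actually works.
RULES = [
    (r"\b(don't|doesn't|didn't|can't|won't)\s+\w+\s+(no|nothing|nobody)\b",
     "Possible double negative."),
    (r"\b(is|are|was|were|been|being|be)\s+\w+ed\b(\s+by\b)?",
     "Possible passive voice; consider whether an active construction is clearer."),
    (r"\bthe the\b",
     "Repeated word: 'the the'."),
]

def check(text):
    """Scan the text against each rule and return (offset, matched span, message) tuples."""
    issues = []
    for pattern, message in RULES:
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            issues.append((m.start(), m.group(0), message))
    return sorted(issues)

if __name__ == "__main__":
    sample = ("The survey was completed by students. "
              "I can't see no reason to ignore the the feedback.")
    for offset, span, message in check(sample):
        print(f"{offset:>3}: '{span}' -> {message}")
```

As in the checkers discussed above, the user is free to accept or ignore each flagged issue; the tool only brings potential problems to the writer's attention.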
Despite their growing popularity, research into online grammar checkers is limited. Vernon (2000) conducted a review of the literature on computerised grammar checkers from 1990-2000 and concluded that research on grammar checkers has largely not kept pace with the technology. Since Vernon's paper, several studies on grammar checkers have emerged. Burston (2008) investigated the applications, implications, effectiveness, and accuracy of a French online grammar checking program called BonPatron and found that out of 335 purposefully incorrect errors, the program detected 296 of them (88%). This was consistent with the findings of Nadasdi and Sinclair (2007), who commented that the program is just as effective as teacher corrections. Another study examined BonPatronPro and concluded that the program increased linguistic accuracy by "40 times" and also increased engagement (Gauthier, 2013, p. 24). Similarly, research by Potter and Fuller (2008) found that the use of English grammar checkers by high school students increased students' motivation, engagement and confidence in grammar rules and English language proficiency.
However, like any technology, grammar checkers have limitations and these are well-documented. In the field of computer-assisted language learning (CALL), Gamper and Knapp (2002) concluded that most CALL programs use Natural Language Processing (NLP) with a focus on syntax, and few address semantic, pragmatic and contextual problems. Similarly, grammar checkers may also largely ignore the role of sentence construction in accounting for different ways to describe the same situation, often regarding the differences between various constructions (such as active and passive voice) as a matter of simple syntactic transformations. However, a student may want (or be required) to write in passive form, for example, in the methods section of a laboratory report. Therefore, while some grammar checkers will warn the user about all occurrences of passive voice, the decision whether to use passive voice or not hinges on what the writer wants to do in that particular context. Another issue identified by
Narita (2012) is that erroneous sentences produced by language learners are quite hard to analyse structurally even with the current state-of-the-art grammar checking technology. Hence, students who are in most need of language and grammar support may receive faulty feedback and suggestions from grammar checkers.
What this literature review illustrates is that a user must evaluate a grammar checker when deciding whether or not to use it. This may be conceptualised using Davis' (1989) Technology Acceptance Model (TAM), which posits that two key factors determine the likelihood of an individual accepting and using a new technology (see Figure 1). The first factor is perceived usefulness, which is defined as a person's judgment about whether using a particular technology will contribute to the attainment of personal goals, such as enhancing performance (Davis, Bagozzi, & Warshaw, 1989). The other element is perceived ease of use, which refers to the level of effort required to use the particular technology (Davis et al., 1989). A review of the literature shows that TAM is considered a valid and reliable measure to predict the acceptance or adoption of new technologies by end-users. It is one of the most frequently employed models for research into new information technology acceptance, and has been applied in various technology contexts and environments (for example, see Gefen & Straub, 1997; Park, Rhoads, Hou, & Lee, 2014; Park, Nam, & Cha, 2012; Straub, Keil, & Brenner, 1997). Although TAM has been found to be a useful theoretical framework in predicting adoption and use of technology innovations, the model has been challenged for its limitations. For example, there are criticisms that TAM is too simplistic (for example, see Bagozzi, 2007); hence the model has been extended and revised to include additional explanatory variables (for example, see Legris, Ingham, & Collerette, 2003; McFarland & Hamilton, 2006; Venkatesh, 2000; Venkatesh & Davis, 2000).
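For readers unfamiliar with the model, the path structure depicted in Figure 1 is commonly summarised along the following lines, where PU is perceived usefulness, PEOU is perceived ease of use, A is attitude toward using, and BI is behavioural intention. This is a schematic sketch of the relationships reported in Davis et al. (1989), not a formula presented in this study.

```latex
% Schematic sketch of the TAM path structure (Davis et al., 1989);
% requires the amsmath package.
\begin{align*}
  \mathit{PU} &= f(\mathit{PEOU},\ \text{external variables})\\
  \mathit{A}  &= g(\mathit{PU},\ \mathit{PEOU})\\
  \mathit{BI} &= h(\mathit{A},\ \mathit{PU})\\
  \text{Actual use} &\propto \mathit{BI}
\end{align*}
```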
For the purposes of this study, Davis' (1989) original model will be used, as it is beyond the scope of this paper to investigate the large number of variables presented in the later models (such as Venkatesh, 2000; Venkatesh & Davis, 2000). Moreover, the two key factors, perceived usefulness and perceived ease of use, remain the most important determinants of technology acceptance and use in the subsequent extended models. The following sections of this paper will provide a short overview of a popular grammar checker website, Grammarly, followed by an evaluation of Grammarly based on survey data from 18 undergraduate students in light of the TAM. The overall aim is to understand the acceptance and use of Grammarly among higher education students.
Grammarly is touted as the world's most accurate English grammar checker. It claims to correct up to ten times more mistakes than popular word processors by providing over 250 grammatical checks and a contextual spell checker (Grammarly, 2015). Grammarly was founded in 2009 by Max Lytvyn and Alex Shevchenko, who first developed the platform's grammar programming. By 2014, Grammarly was ranked 55th in Deloitte's index of fastest growing companies and currently has over four million registered users (Grammarly, 2015).
To use Grammarly, users copy and paste a text into the input box, or upload a document. Grammarly's free version provides grammar, punctuation, spelling, sentence structure and style support (see Figure 2). The premium subscription, which costs USD $139.95 a year, checks an additional 150 grammar points, provides plagiarism detection, vocabulary enhancement suggestions and a contextual spelling feature, and gives users a score out of 100 (Grammarly, 2015) (see Figure 3). It also provides both 'short' and 'long' explanations of each grammar issue it addresses, with corresponding feedback that often includes examples of correct and incorrect usage in green and red respectively (see Figure 4). Users can click the suggested correction to apply it to the text, or click 'ignore' to move on. Users can also simply read through the feedback without needing to accept or ignore each comment. Before reviewing the text, the premium version also asks users to select a paper type such as essay, dissertation, presentation, blog, business document or creative writing to improve the accuracy of the feedback. For example, Grammarly does not indicate that starting a sentence with a conjunction is an issue when 'creative writing' is selected, but it does when 'academic essay' is chosen. On the whole, Grammarly tends to be conservative in its judgments, also advising against using contractions, such as hasn't and can't, and ending a sentence with a preposition. The premium version also provides a plugin for Microsoft Office and offers 24/7 email and phone support.
Grammarly also offers licences for K-12 and higher education institutions through Grammarly@edu. Institutions can purchase either a campus licence, which gives unlimited access to all students and educators on a single campus, or a volume licence, which allows the use of Grammarly by a pre-set group of individual users in blocks of 50 (Grammarly@edu, 2015). Currently, over 600 universities and corporations have licensed the software (Grammarly, 2015), including several Australian higher education institutions. In addition to its flagship product, the grammar checker, Grammarly also offers a suite of free writing resources, including Grammarly Answers, an online community for writers to ask and answer questions about English writing; Grammarly Handbook, an online guide explaining English grammar and style; Grammarly Words, a contextual online thesaurus; and the Grammarly Blog and Facebook community, which provide fans with grammar tips and discussions.
This section of the paper describes a small-scale trial of Grammarly across two Navitas colleges: the Australian College of Applied Psychology (ACAP) and Navitas College of Public Safety (NCPS). ACAP offers a range of VET, undergraduate and postgraduate courses in counselling, psychology, social work, social science, case management and youth work. NCPS offers a Bachelor of Criminology and Justice. There are campuses nationally, and the cohort includes a high percentage of online students. Navitas provides academic language and learning support to these students via the Student Learning Support (SLS) department made up of eight ALL staff, including the researcher of the current project.
SLS staff looked into Grammarly in early 2015 when a student enquired whether she could obtain a free or discounted premium subscription through Navitas. We asked Grammarly for a quote for a campus licence, which gives unlimited access to all students and instructors on a single campus. For our cohort of 6000 students, it would cost approximately USD $9000 for a one-year subscription. As we had not yet tried Grammarly, we were reluctant to pay that much for the campus licence, so we asked our Grammarly contact if there was another option available. He offered Navitas discounted individual premium subscriptions for students at the price of USD $45 a year ($3.75 a month) instead of the usual USD $139.95 a year. Grammarly created a personalised link to the page where Navitas students could sign up to the offer. In addition, Grammarly offered free accounts for all staff by providing an access code to use on signup.
At the start of each trimester in 2015, SLS staff posted an announcement with the offer details on each school's online 'student lounge' class space. We also advertised the offer on the SLS Facebook page. In addition, we notified faculty staff of the offer to sign up for a free account and encouraged them to let their students know about the student deal. Grammarly also ran a live webinar for staff to demonstrate how to use the program and answer any questions.
At the end of each trimester, Grammarly provided us with a list of Navitas users who had signed up using the Navitas discounted subscription link. Somewhat surprisingly, only 37 students signed up to Grammarly using the Navitas discounted subscription offer in 2015. However, this number may be inaccurate for several reasons. Firstly, when we advertised the offer in trimester 1, we did not specify that the student had to sign up with their Navitas student email. As Grammarly ran the search on the Navitas domain names (for example, @my.acap.edu.au), students who had signed up with a different email address (such as a Hotmail address) were not included in the list. Secondly, SLS staff were unable to send an email directly to each student (this is against the colleges' protocols), so the notice about the offer was posted on each school's online
'student lounge' along with many other notices that get posted at the start of term. Consequently, the information may not have been seen by many students. In our consultations and workshops with students, SLS staff found that many students were unaware of the Grammarly offer.
At the end of each trimester, a link to a Survey Monkey questionnaire was emailed to the students who had signed up to Grammarly, with the offer to go into the draw for a $30 Westfields gift card if they completed the evaluation. In total, 18 of the 37 students who had signed up (48.6%) responded to the survey. The survey sought to collect information on the students' experience with Grammarly and its impact on their writing. In particular, the survey aimed to gather data on Grammarly's perceived usefulness and perceived ease of use, as these are two key factors in determining whether an individual will accept and adopt a technology according to the TAM. The survey comprised three sections: (1) student data, which asked students about their qualification and language abilities, (2) Grammarly evaluation, which asked students to rate statements about usability and usefulness, and (3) the impact of Grammarly, which asked students about the effect of Grammarly on their writing quality, confidence and assignment marks.
The survey data revealed that of the 18 students, 14 were enrolled at ACAP and four were enrolled at NCPS. Four students were studying at Certificate III/Diploma level, 10 at Bachelor
level, and four at Graduate Diploma/Master level. Eight of the students stated that English was not their first language (44.4%).
Students were asked about their thoughts on their writing in general. Students were presented with five statements (see Table 1) and asked to indicate their level of agreement with each statement. Eleven out of 18 respondents (61.1%) 'agreed' or 'strongly agreed' that they only needed a proofreading service, and 10 out of 18 respondents (55.6%) 'disagreed' or 'strongly disagreed' that their knowledge of English grammar and vocabulary is weak, with only two students (11.1%) 'agreeing' with this statement. It appears there was more uncertainty for students around whether they had written correct sentences, with nine out of 18 (50%) 'agreeing' they do not always feel confident with this, and seven (38.9%) 'agreeing' they had trouble expressing their ideas in writing. Most students felt they understood the feedback they received on their writing.
Table 1. Students' thoughts on their writing in general (n = 18).
| | Strongly disagree | Disagree | Neutral | Agree | Strongly agree |
| I don't need any help with writing in English; I just need a proofreading service | 0 (0%) | 4 (22.2%) | 3 (16.7%) | 5 (27.8%) | 6 (33.3%) |
| My knowledge of English grammar and vocabulary is weak | 3 (16.7%) | 7 (38.9%) | 6 (33.3%) | 2 (11.1%) | 0 (0%) |
| I don't always feel confident that I have written correct sentences | 1 (5.6%) | 5 (27.8%) | 3 (16.7%) | 9 (50%) | 0 (0%) |
| I am fine with English grammar, but I find it difficult to express my ideas in writing | 1 (5.6%) | 6 (33.3%) | 4 (22.2%) | 7 (38.9%) | 0 (0%) |
| I don't always understand the feedback I get in my writing | 3 (16.7%) | 7 (38.9%) | 4 (22.2%) | 3 (16.7%) | 1 (5.6%) |
More than half the respondents said they used Grammarly 'once or twice a month' (55.6%), which may coincide with assessment task deadlines. Of the remaining respondents, two stated they used Grammarly 'every day', four stated 'once or twice a week' and two stated 'less than once a month or hardly ever'. Of the 18 respondents, one student stated they take up 'all' the suggestions, 13 take up 'most', three take up 'about half', and one takes up 'some'.
Respondents were asked to rate, on a scale of 0-5, Grammarly's usefulness and ease of use, the two key principles of the TAM. In terms of usefulness, 15 of the 18 students (83.3%) rated Grammarly a 4 or 5, with 5 being 'extremely useful'. One student rated Grammarly as 'not useful at all' (see Table 2). For ease of use, 17 of the 18 students (94.4%) rated Grammarly a 4 or 5, with 5 being 'extremely easy' (see Table 3). These results suggest that Grammarly rates highly in the context of the TAM; the short tally sketch following Table 3 shows how these proportions are derived from the table counts.
Table 2. Students' ratings of Grammarly's usefulness (n = 18).
| Not useful at all 0 | 1 | 2 | 3 | 4 | Extremely useful 5 |
| 1 (5.6%) | 1 (5.6%) | 1 (5.6%) | 0 (0%) | 6 (33.3%) | 9 (50%) |
Table 3. Students' ratings of Grammarly's ease of use (n = 18).
| Not easy at all 0 | 1 | 2 | 3 | 4 | Extremely easy 5 |
| 0 (0%) | 1 (5.6%) | 0 (0%) | 0 (0%) | 7 (38.9%) | 10 (55.6%) |
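As a transparency check, the proportions reported above can be reproduced directly from the cell counts in Tables 2 and 3. The brief Python sketch below (counts transcribed from the tables; variable and function names are illustrative, not part of the survey instrument) tallies the share of respondents who rated each construct a 4 or 5.

```python
# Rating counts transcribed from Tables 2 and 3 (scale 0-5, n = 18 respondents).
usefulness  = {0: 1, 1: 1, 2: 1, 3: 0, 4: 6, 5: 9}   # Table 2
ease_of_use = {0: 0, 1: 1, 2: 0, 3: 0, 4: 7, 5: 10}  # Table 3

def high_rating_share(counts, threshold=4):
    """Return (high, total, percentage) for ratings at or above the threshold."""
    total = sum(counts.values())
    high = sum(n for rating, n in counts.items() if rating >= threshold)
    return high, total, 100 * high / total

for label, counts in [("usefulness", usefulness), ("ease of use", ease_of_use)]:
    high, total, pct = high_rating_share(counts)
    print(f"{label}: {high}/{total} rated 4 or 5 ({pct:.1f}%)")
```

Running the sketch gives 15/18 (83.3%) for usefulness and 17/18 (94.4%) for ease of use, matching the figures reported above.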
Students were also asked in what ways they found Grammarly helpful. Students were presented with four statements (see Table 4) and asked to indicate their level of agreement with each statement. Fifteen of the 18 respondents (83.3%) 'agreed' or 'strongly agreed' that Grammarly gave detailed feedback, and 15 out of 18 (83.3%) 'agreed' or 'strongly agreed' that Grammarly made helpful suggestions, although two students (11.1%) 'disagreed'. Thirteen students (72.2%) 'agreed' or 'strongly agreed' that the explanations were good, while four students selected 'neutral' and one selected 'disagree'. Thirteen of the 18 students (72.2%) 'agreed' or 'strongly agreed' that Grammarly helped them understand grammar rules while five students (27.8%) chose 'neutral'.
Table 4. Ways students found Grammarly helpful (n = 18).
| | Strongly disagree | Disagree | Neutral | Agree | Strongly agree |
| Grammarly gives detailed feedback | 0 (0%) | 0 (0%) | 3 (16.7%) | 12 (66.7%) | 3 (16.7%) |
| Grammarly makes helpful suggestions for improving my work | 0 (0%) | 2 (11.1%) | 1 (5.6%) | 10 (55.6%) | 5 (27.8%) |
| Grammarly gives good explanations about my errors | 0 (0%) | 1 (5.6%) | 4 (22.2%) | 9 (50%) | 4 (22.2%) |
| Grammarly has helped me understand grammar rules | 0 (0%) | 0 (0%) | 5 (27.8%) | 10 (55.6%) | 3 (16.7%) |
Students were also asked what they disliked about Grammarly. Students were presented with four statements (see Table 5) and asked to indicate their level of agreement with each statement. Four of the 18 respondents (22.2%) 'agreed' that the feedback was not always helpful. Eight students (44.4%) 'agreed' or 'strongly agreed' that they did not agree with some of the suggestions. Two students 'agreed' that they could not understand the explanations. One student 'agreed' that they had technical issues with Grammarly. There was an 'other' field that allowed students to put in their own response, and one student wrote that "It was American grammer [sic] so this was frustrating".
Table 5. Aspects of Grammarly students disliked (n = 18).
| | Strongly disagree | Disagree | Neutral | Agree | Strongly agree |
| The feedback is not always helpful | 3 (16.7%) | 6 (33.3%) | 5 (27.8%) | 4 (22.2%) | 0 (0%) |
| I do not agree with some of the suggestions | 1 (5.6%) | 4 (22.2%) | 5 (27.8%) | 6 (33.3%) | 2 (11.1%) |
| I cannot understand the explanations | 4 (22.2%) | 9 (50%) | 3 (16.7%) | 2 (11.1%) | 0 (0%) |
| I have technical issues with Grammarly | 8 (44.4%) | 7 (38.9%) | 2 (11.1%) | 1 (5.6%) | 0 (0%) |
In the third section of the survey, the students were asked to consider the impact of Grammarly on their writing. Sixteen of the 18 respondents stated it had a 'positive impact' on their writing while two said 'no impact at all'. Students were also asked to consider whether Grammarly had given them more confidence in their writing. Fourteen of the 18 respondents stated 'yes' while three stated 'no' and one stated 'unsure'. Students were also asked whether they thought Grammarly helped them get a better mark on their assignments. Nine stated 'yes', three stated 'no' and six were 'unsure'.
Finally, students were asked for further comments about Grammarly in their own words. Most of the comments described Grammarly as useful, helpful and easy to use:
“a very useful tool”
“helpful and informative”
“positive and pleasant experience. Very user friendly”
Some students showed they were using it thoughtfully and critically, and could see its value:
“Very useful, I may not choose to make the changes it suggests, but find thinking about it very useful”
“I only use Grammarly for proof reading and while it has found several mistakes I have missed, they were very minor issues and could probably have been noticed if I took better care in reading my work.”
One student commented on the significant impact on his/her marks:
“I find it extremely helpful. I have seen a massive upturn in my marks after using Grammarly for my academic writing.”
However, some students identified difficulties with it:
“It's okay, they suggested something that made no sense”
“I found Grammarly is not very reliable, it gives feedback that doesn't make any sense and I become [sic] very confused after using it to review my work.”
“The grammer [sic] is American so I still had to consider Australian spelling & grammer [sic]. The site was hard to navigate also so I won't use it again.”
“Grammarly should be adapted to an Australian dictionary”
Overall, students reported that Grammarly was helpful and easy to use. As is evident from Tables 2 and 3, the factors 'usefulness' and 'ease of use' were both evaluated positively by more than
80% of the students. According to the TAM, because those students found Grammarly useful and easy to use, it is likely they will continue to use Grammarly.
With regard to perceived usefulness, most students reported that they found the suggestions helpful for improving the particular paper they had submitted to Grammarly, and half felt it helped them achieve a better mark. In the open-ended comments, 10 of the 18 students used the words “helpful” or “useful”, and, as mentioned, one student commented that he or she had noticed “a massive upturn in my marks after using Grammarly for my academic writing”, which highlights that this student could see the immediate benefits of Grammarly.
The survey results also suggest that there may be longer-term benefits for students as well. Most students felt that the explanations had helped them understand grammar rules. This indicates that Grammarly may be useful for learning about grammar, which may transfer to future pieces of writing. Therefore, Grammarly may, in fact, provide extra opportunities for language learning that is individualised and self-directed, which aligns with AbuSeileek's (2009) findings. Moreover, 14 of the students surveyed (77.8%) felt that it positively influenced their writing confidence, which is similar to the result of Potter and Fuller (2008), who also found that the use of grammar checkers increased students' motivation, engagement and confidence in grammar rules.
On the other hand, the student's comment that “I only use Grammarly for proof reading and while it has found several mistakes I have missed, they were very minor issues and could probably have been noticed if I took better care in reading my work” highlights that Grammarly may not be as useful if careful proofreading and revision is undertaken before submitting the paper to Grammarly. Having said that, Grammarly highlighted to this student that careful proofreading is important; hence, Grammarly may help students to recognise that proofreading is a key task in the writing process and illuminate the kinds of errors students should be looking for.
Another factor that may limit the extent of Grammarly's usefulness is that students felt some of the recommendations were flawed or hard to understand. However, the students also showed an awareness of these limitations, with most students choosing not to accept all the suggested corrections. One student commented that it was still a helpful process: “I may not choose to make the changes it suggests, but find thinking about it very useful”. In this student's case, the feedback from Grammarly led to reflection about grammar that may not have occurred otherwise. The fact that some of Grammarly's suggestions are flawed, or that students do not always want to take up a comment, highlights that students need to be discerning about which suggestions to take up, so Grammarly may benefit more able writers. This agrees with the findings of Jones et al. (2013), who found that their grammar intervention benefited stronger writers more than weaker writers, and suggested that this was because more able writers “have clearer communicative and rhetorical intentions for their writing than less able writers, enabling them to make more appropriate use of their grammatical understanding to shape text appropriately” (p. 1256).
With regard to the second key factor of the TAM, perceived ease of use, 17 of the 18 students (94.4%) rated Grammarly a 4 or 5, with 5 being 'extremely easy'. Only one student 'agreed' that they had technical issues with Grammarly. However, two students found Grammarly difficult to use because of its American English focus; as one noted, “The grammer [sic] is American so I still had to consider Australian spelling & grammer [sic]”. Grammarly claims to identify automatically whether the text is in British or American English, so this feature may not be as sensitive as it claims, or it may not work if the text contains a mixture of British and American English. Because the user is unable to choose the dictionary manually, this affects both Grammarly's usefulness and ease of use. The same student also stated that “The site was hard to navigate also so I won't use it again”, which clearly shows the link between the perceived ease of use and acceptance of the technology, as suggested by the TAM.
A student also reported that Grammarly “suggested something that made no sense” and another stated that “it gives feedback that doesn't make any sense and I become [sic] very confused after using it to review my work”. While it is unclear if these comments refer to the usefulness of the comment or the ease of understanding the comment, it is possible that feedback and explanations from Grammarly may be overwhelming or confusing for some students. In other words, Grammarly's use of metalinguistic terminology may be a barrier rather than a support for some
students. This idea aligns with the findings of Jones et al. (2013), who reported that for some students “the level of conceptual thinking required to understand grammatical concepts and transfer that learning into their writing was too high a cognitive challenge” (p. 1256). Therefore, advisers may need to work initially with students to unpack some of the feedback and suggestions from Grammarly.
In light of the survey data, it seems likely that most of the students would continue to use Grammarly as it is generally perceived as useful and easy to use. A follow-up survey is planned to see whether these students do in fact continue to use Grammarly in subsequent terms. It would also be beneficial to explore Grammarly against one of the extensions of the TAM. For example, Venkatesh and Davis' (2000) extension of the TAM also considers how social influence processes such as subjective norm, voluntariness and image impact on technology acceptance, and McFarland and Hamilton's (2006) TAM extension also includes contextual variables such as prior experience, others' use, computer anxiety, system quality, task structure, and organisational support. It is also important to note that the study was very small scale, with only 18 student participants, so similar studies with a larger sample are needed to corroborate these findings. In addition, more research into the accuracy of the recommendations, as well as adviser and educator perceptions of the technology, would benefit advisers, educators and students when deciding how to use these tools.
Research on online grammar checkers is limited, so this paper aimed to provide insights into the usefulness and perception of Grammarly, a popular online grammar checker. Student evaluations of Grammarly were generally in agreement that it is useful and easy to use, and students stated that Grammarly increased their confidence in writing and their understanding of grammatical concepts. The findings suggest that students can benefit from Grammarly's individual instruction and the self-access nature of the tool. It can complement ALL practitioners' feedback to students and can mitigate issues such as lack of time to address grammatical problems in student writing, leaving more time for advisers to focus on higher-level writing concerns. Although Grammarly is quite sophisticated, users should carefully consider each suggestion in light of its sometimes flawed recommendations. Advisers and students should approach grammar checkers critically, and influencing factors such as usefulness and ease of use, as outlined in the Technology Acceptance Model, should be considered.
AbuSeileek, A. F. (2009). The effect of using an online-based course on the learning of grammar inductively and deductively. ReCALL, 21(3), 319-336.
Bagozzi, R. P. (2007). The legacy of the Technology Acceptance Model and a proposal for a paradigm shift. Journal of the Association for Information Systems, 8(4), 244-254.
Bitchener, J. (2008). Evidence in support of written corrective feedback. Journal of Second Language Writing, 17(2), 102-118.
Burston, J. (2008). Bon Patron: An online spelling, grammar, and expression checker. Calico Journal, 25(2), 337-347.
Clerehan, R., & Moore, T. (1995). Analysing the “text on the page”: Some directions from applied linguistics. In M. Garner, K. Chanock, & R. Clerehan (Eds.), Academic skills advising: Towards a discipline (pp. 65-75). Melbourne, Australia: Victorian Language and Learning Network.
Coffin, C., Curry, M. J., Goodman, S., Hewings, A., Lillis, T., & Swann, J. (2005). Teaching academic writing: A toolkit for higher education. London, England: Routledge.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-339.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982-1003.
Ferris, D., & Roberts, B. (2001). Error feedback in L2 writing classes: How explicit does it need to be? Journal of Second Language Writing, 10(3), 161-184.
Gamper, J., & Knapp, J. (2002). A review of intelligent CALL systems. Computer Assisted Language Learning, 15(4), 329-342.
Garrett, N. (2009). Computer‐assisted language learning trends and issues revisited: Integrating innovation. The Modern Language Journal, 93(s1), 719-740.
Gauthier, M. (2013). Anglophone high school boys' engagement and achievement in editing their French writing using the BonPatronPro. Journal of Classroom Research in Literacy, 6, 24-35.
Gefen, D., & Straub, D. W. (1997). Gender differences in the perception and use of e-mail: An extension to the technology acceptance model. MIS Quarterly, 21(4), 389-400.
Grammarly. (2015). Retrieved July 18, 2015, from https://www.grammarly.com
Grammarly@edu. (2015). Retrieved August 2, 2015, from http://www.grammarly.com/edu/
Jones, S., Myhill, D., & Bailey, T. (2013). Grammar for writing? An investigation of the effects of contextualised grammar teaching on students' writing. Reading and Writing, 26(8), 1241-1263.
Krashen, S. (1982). Principles and practice in second language acquisition. Oxford, UK: Pergamon.
Legris, P., Ingham, J., & Collerette, P. (2003). Why do people use information technology? A critical review of the technology acceptance model. Information and Management, 40(3), 191-204.
Major, M. J. (1994). Spelling, grammar and style go electronic. Managing Office Technology, 39(4), 18-21.
McFarland, D. J., & Hamilton, D. (2006). Adding contextual specificity to the technology acceptance model. Computers in Human Behavior, 22(3), 427-447.
Myhill, D. (2009). Becoming a designer: Trajectories of linguistic development. In R. Beard, D. Myhill, M. Nystrand, & J. Riley (Eds.), The SAGE handbook of writing development (pp. 402-415). London, England: SAGE Publications.
Nadasdi, T., & Sinclair, S. (2007). Anything I can do, CPU can do better: A comparison of human and computer grammar correction for L2 writing using BonPatron.com. Unpublished manuscript. Retrieved July 6, 2015 from https://www.ualberta.ca/~tnadasdi/Dublin.htm
Narita, M. (2012). Developing a corpus-based online grammar tutorial prototype. Language Teacher, 36(5), 23-31.
Neumann, R. (1985). English language problems and university students from a non-English speaking background. Higher Education Research and Development, 4(2), 193-202.
Park, N., Rhoads, M., Hou, J., & Lee, K. M. (2014). Understanding the acceptance of teleconferencing systems among employees: An extension of the technology acceptance model. Computers in Human Behavior, 39, 118-127.
Park, S. Y., Nam, M. W., & Cha, S. B. (2012). University students' behavioral intention to use mobile learning: Evaluating the technology acceptance model. British Journal of Educational Technology, 43(4), 592-605.
Perera, K. (1984). Children's writing and reading: Analysing classroom language. Oxford, England: Blackwell.
Pogue, D. (1993). Grammar crackers. Macworld, 10(11), 183-186.
Potter, R., & Fuller, D. (2008). My new teaching partner? Using the grammar checker in writing instruction. English Journal, 98(1), 36-41.
Straub, D., Keil, M., & Brenner, W. (1997). Testing the technology acceptance model across cultures: A three country study. Information and Management, 33(1), 1-11.
Venkatesh, V. (2000). Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion into the technology acceptance model. Information Systems Research, 11(4), 342-365.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186-204.
Vernon, A. (2000). Computerized grammar checkers 2000: Capabilities, limitations, and pedagogical possibilities. Computers and Composition, 17(3), 329-349.